Stochastic Window Transformer for Image Restoration

Neural Information Processing Systems

Thanks to their powerful representation capabilities, transformers have made impressive progress in image restoration. However, existing transformer-based methods do not carefully consider the particularities of image restoration. In general, an ideal restoration approach should be translation-invariant to the degradation, i.e., the undesirable degradation should be removed irrespective of its position within the image. Furthermore, local relationships also play a vital role and should be faithfully exploited when recovering clean images. Nevertheless, most transformers adopt either local attention with a fixed local window strategy or global attention, which unfortunately breaks translation invariance and causes a huge loss of local relationships.
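A fixed window grid ties each attention region to an absolute position, which is what breaks translation invariance: shifting the degradation by a few pixels changes which window it falls into. Randomizing the grid offset at every forward pass removes this dependence in expectation. Below is a minimal sketch of such a stochastic window partition; the function names and the NumPy-based layout are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def stochastic_window_partition(x, win, rng):
    """Partition an (H, W, C) feature map into non-overlapping windows
    after a random cyclic shift, so window boundaries vary per call."""
    H, W, C = x.shape
    assert H % win == 0 and W % win == 0
    # draw a random shift in [0, win) along each spatial axis
    dy, dx = rng.integers(0, win, size=2)
    shifted = np.roll(x, shift=(-dy, -dx), axis=(0, 1))
    # reshape into (num_windows, win*win, C) token groups for local attention
    wins = shifted.reshape(H // win, win, W // win, win, C)
    wins = wins.transpose(0, 2, 1, 3, 4).reshape(-1, win * win, C)
    return wins, (dy, dx)

def stochastic_window_reverse(wins, shape, win, shift):
    """Undo the window partition and the cyclic shift."""
    H, W, C = shape
    dy, dx = shift
    x = wins.reshape(H // win, W // win, win, win, C)
    x = x.transpose(0, 2, 1, 3, 4).reshape(H, W, C)
    return np.roll(x, shift=(dy, dx), axis=(0, 1))

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8, 4))
wins, shift = stochastic_window_partition(x, win=4, rng=rng)
x_rec = stochastic_window_reverse(wins, x.shape, win=4, shift=shift)
assert np.allclose(x, x_rec)  # partition + reverse is lossless
```

Because the offset is uniform over the window size, every spatial position is equally likely to land anywhere inside a window, so averaging over shifts makes the attention pattern translation-invariant on average.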


Supplemental Material for Stochastic Window Transformer for Image Restoration

Neural Information Processing Systems

Small patches with global attention break translation invariance and cause a loss of locality; under this setting, the broken translation invariance derives from two aspects. Figure 1 illustrates the context of the shifted window strategy versus the sliding window strategy. Taking the expectation boosts performance. PSNR [7] and SSIM [17] are computed on the luminance channel, i.e., the Y channel of YCbCr space.
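The Y-channel evaluation mentioned above can be reproduced with the standard BT.601 RGB-to-YCbCr conversion. A minimal sketch of Y-channel PSNR follows, assuming 8-bit RGB inputs; the function names are illustrative, not from the paper's code.

```python
import numpy as np

def rgb_to_y(img):
    """BT.601 luma (Y of YCbCr) from an (H, W, 3) uint8 RGB image."""
    img = img.astype(np.float64)
    return 16.0 + (65.481 * img[..., 0]
                   + 128.553 * img[..., 1]
                   + 24.966 * img[..., 2]) / 255.0

def psnr_y(ref, test, peak=255.0):
    """PSNR computed on the luminance (Y) channel only."""
    mse = np.mean((rgb_to_y(ref) - rgb_to_y(test)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

ref = np.full((4, 4, 3), 128, dtype=np.uint8)
noisy = ref.copy()
noisy[0, 0, 0] += 10  # perturb one pixel's red channel
score = psnr_y(ref, noisy)
```

Evaluating on Y only is the common convention for restoration benchmarks, since the human visual system is far more sensitive to luminance error than to chrominance error.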